-
Quantification of all types of uncertainty helps to establish reliability in any analysis. This research focuses on uncertainty at two attribute levels of wetland classification and creates visualization tools to guide analysis of spatial uncertainty patterns over several scales. A novel variant of confusion matrix analysis compares the Cowardin and Hydrogeomorphic wetland classification systems, identifying areas and types of misclassification for binary and multivariate categories. Uncertainty here refers to categorical consistency, that is, agreement between the two classification systems, rather than agreement between observed data and ground truth. Consistency is quantified using confusion matrix analysis. Aggregation across progressively larger focal windows transforms the confusion matrix into a multiscale data pyramid for quick determination of where attribute uncertainty is highly variable and at what spatial resolutions classification inconsistencies emerge. The focal pyramids summarize precision, recall, and F1 scores to visualize classification differences across spatial scales. Findings show that F1 scores are the most informative indicator of agreement on wetland misclassification at both coarse and fine attribute scales. The pyramid organizes multi-scale uncertainty in a single unified framework and can be “sliced” to view attribute consistency at individual focal levels. Results demonstrate how the confusion matrix can quantify the percentage of a study area in which inconsistencies in wetland presence and type occur. The research provides confusion metrics and display tools that focus attention on specific areas of large data sets where attribute uncertainty patterns may be complex, reducing land managers’ workloads by highlighting areas where field checking might be appropriate and improving analytics with visualization tools that quickly show where such areas occur.
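A minimal sketch of the focal confusion-matrix idea described above, assuming two co-registered binary rasters (1 = wetland, 0 = non-wetland) standing in for the Cowardin and Hydrogeomorphic maps; the window sizes, function name, and synthetic data are illustrative choices, not taken from the paper.

```python
import numpy as np

def focal_f1_pyramid(ref, cmp, window_sizes):
    """Build a pyramid of focal precision/recall/F1 grids comparing two
    co-registered binary rasters (1 = wetland, 0 = non-wetland)."""
    pyramid = []
    rows, cols = ref.shape
    for w in window_sizes:
        half = w // 2
        precision = np.full(ref.shape, np.nan)
        recall = np.full(ref.shape, np.nan)
        f1 = np.full(ref.shape, np.nan)
        for i in range(half, rows - half):
            for j in range(half, cols - half):
                r = ref[i - half:i + half + 1, j - half:j + half + 1]
                c = cmp[i - half:i + half + 1, j - half:j + half + 1]
                tp = np.sum((r == 1) & (c == 1))
                fp = np.sum((r == 0) & (c == 1))
                fn = np.sum((r == 1) & (c == 0))
                p = tp / (tp + fp) if (tp + fp) else np.nan
                rc = tp / (tp + fn) if (tp + fn) else np.nan
                precision[i, j] = p
                recall[i, j] = rc
                f1[i, j] = (2 * p * rc / (p + rc)) if (p + rc) > 0 else np.nan
        pyramid.append({"window": w, "precision": precision,
                        "recall": recall, "f1": f1})
    return pyramid

# Example: two small synthetic rasters with ~10% simulated disagreement.
rng = np.random.default_rng(0)
cowardin = (rng.random((60, 60)) > 0.5).astype(int)
hgm = cowardin.copy()
flip = rng.random((60, 60)) < 0.1
hgm[flip] = 1 - hgm[flip]
pyr = focal_f1_pyramid(cowardin, hgm, window_sizes=[3, 9, 27])
for level in pyr:
    print(level["window"], np.nanmean(level["f1"]))
```

Each level of the returned list is one "slice" of the pyramid: a grid of per-window scores at a single focal scale, so low-F1 patches can be located at each scale and compared across scales.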
-
Many spatial analysis methods suffer from the scaling issue identified as part of the Modifiable Areal Unit Problem (MAUP). This article introduces the Pyramid Model (PM), a hierarchical data framework that integrates space and spatial scale in a 3D environment to support multi-scale analysis. The utility of the PM is tested by examining quadrat density and kernel density, two commonly used measures of point patterns. The two metrics, computed from a simulated point set with varying scaling parameters (i.e. quadrat sizes and kernel bandwidths), are represented in the PM. The PM permits examination of how the density metrics vary across all computed scales. 3D visualization techniques (e.g. volume display, isosurfaces, and slicing) allow users to observe nested relations between spatial patterns at different scales and to understand the scaling issue and MAUP in spatial analysis. A tool with interactive controls is developed to support visual exploration of the internal patterns in the PM. In addition to point pattern measures, the PM has potential for analyzing other spatial indices, such as spatial autocorrelation indicators, regression coefficients, and accuracy measures of spatial models. The implementation of the PM further advances the development of a multi-scale framework for spatio-temporal analysis.
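A rough sketch of how a scale-space volume of kernel density surfaces might be assembled, assuming a simulated 2-D point set on a unit square; the grid resolution, bandwidth list, and function names are illustrative and do not reproduce the paper's implementation.

```python
import numpy as np

def kde_grid(points, grid_x, grid_y, bandwidth):
    """Gaussian kernel density estimate on a regular grid for one bandwidth."""
    gx, gy = np.meshgrid(grid_x, grid_y)
    density = np.zeros_like(gx, dtype=float)
    for px, py in points:
        d2 = (gx - px) ** 2 + (gy - py) ** 2
        density += np.exp(-d2 / (2 * bandwidth ** 2))
    return density / (2 * np.pi * bandwidth ** 2 * len(points))

# Simulated point set and a stack of density surfaces, one per bandwidth.
# The stack forms a (scale, y, x) volume analogous to the pyramid, which
# can be sliced at one scale or rendered with isosurfaces across scales.
rng = np.random.default_rng(1)
points = rng.random((200, 2))
grid_x = np.linspace(0, 1, 100)
grid_y = np.linspace(0, 1, 100)
bandwidths = [0.02, 0.05, 0.1, 0.2]
pyramid = np.stack([kde_grid(points, grid_x, grid_y, bw) for bw in bandwidths])
print(pyramid.shape)  # (n_scales, ny, nx)
```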
-
Distance is the most fundamental metric in spatial analysis and modeling. Planar distance and geodesic distance are the common distance measurements in current geographic information systems and geospatial analytic tools. However, little is understood about how to measure distance on a digital terrain surface or about the uncertainty of that measurement. To fill this gap, this study applies a Monte Carlo simulation to evaluate seven surface-adjustment methods for distance measurement on digital terrain models. Using parallel computing techniques and a memory optimization method, the processing time for the distance calculations of 6,000 simulated transects was reduced to a manageable level. The accuracy and computational efficiency of the surface-adjustment methods were systematically compared in six study areas with various terrain types and with digital elevation models (DEMs) at different resolutions. Major findings indicate a trade-off between measurement accuracy and computational efficiency: calculations on finer-resolution DEMs improve measurement accuracy but increase processing time. Among the methods compared, the weighted-average method demonstrates the highest accuracy and the second-fastest processing time. Additionally, the choice of surface-adjustment method has a greater impact on the accuracy of distance measurements in rougher terrain.
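An illustrative sketch of one simple surface-adjustment approach: interpolate elevations along a straight transect and sum the 3-D step lengths, then compare against the planar length. The synthetic DEM, function name, and sampling density are assumptions; the paper evaluates seven specific methods, and this sketch does not reproduce any of them exactly.

```python
import numpy as np

def surface_distance(dem, cellsize, start, end, n_samples=200):
    """Approximate surface-adjusted distance along a straight transect.

    dem: 2-D elevation array; start/end: (row, col) endpoints in cell units.
    Elevations along the transect are bilinearly interpolated, then planar
    and vertical increments are combined with the Pythagorean rule.
    """
    rows = np.linspace(start[0], end[0], n_samples)
    cols = np.linspace(start[1], end[1], n_samples)
    r0 = np.clip(np.floor(rows).astype(int), 0, dem.shape[0] - 2)
    c0 = np.clip(np.floor(cols).astype(int), 0, dem.shape[1] - 2)
    fr, fc = rows - r0, cols - c0
    z = (dem[r0, c0] * (1 - fr) * (1 - fc) + dem[r0 + 1, c0] * fr * (1 - fc)
         + dem[r0, c0 + 1] * (1 - fr) * fc + dem[r0 + 1, c0 + 1] * fr * fc)
    dxy = np.hypot(np.diff(rows), np.diff(cols)) * cellsize  # planar steps
    dz = np.diff(z)                                          # elevation steps
    return np.sum(np.hypot(dxy, dz)), np.sum(dxy)            # surface, planar

# Synthetic rough terrain standing in for a DEM tile (30 m cells assumed).
rng = np.random.default_rng(2)
dem = np.cumsum(np.cumsum(rng.normal(0, 1.0, (100, 100)), axis=0), axis=1)
surface, planar = surface_distance(dem, cellsize=30.0, start=(5, 5), end=(90, 90))
print(f"planar {planar:.1f} m, surface-adjusted {surface:.1f} m")
```

The gap between the two returned values grows with terrain roughness and shrinks as the DEM is smoothed or coarsened, which is the trade-off the abstract describes.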
